An atomic clock is a clock device that uses an electronic transition frequency in the microwave, optical, or ultraviolet region of the electromagnetic spectrum of atoms as the frequency standard for its timekeeping element. Atomic clocks are the most accurate time and frequency standards known, and are used as primary standards for international time distribution services, to control the wave frequency of television broadcasts, and in global navigation satellite systems such as GPS.

The principle of operation of an atomic clock is based not on nuclear physics but on atomic physics: it uses the microwave signal that electrons in atoms emit when they change energy levels. Early atomic clocks were based on masers at room temperature. Currently, the most accurate atomic clocks first cool the atoms to near absolute zero by slowing them with lasers, then probe them in atomic fountains inside a microwave-filled cavity. An example is the NIST-F1 atomic clock, one of the United States' national primary time and frequency standards.

The accuracy of an atomic clock depends on two factors. The first is the temperature of the sample atoms: colder atoms move much more slowly, allowing longer probe times. The second is the frequency and intrinsic linewidth of the electronic transition: higher frequencies and narrower lines increase the precision.

National standards agencies in many countries maintain a network of atomic clocks which are intercompared and kept synchronised to an accuracy of 10<sup>−9</sup> seconds per day (approximately 1 part in 10<sup>14</sup>). These clocks collectively define a continuous and stable time scale, International Atomic Time (TAI). For civil time, another time scale is disseminated: Coordinated Universal Time (UTC). UTC is derived from TAI but is approximately synchronised, by means of leap seconds, to UT1, which is based on the Earth's actual rotation (mean solar time).
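The equivalence quoted above can be checked with a line of arithmetic: a drift of 10<sup>−9</sup> seconds accumulated over one day (86,400 seconds) corresponds to a fractional frequency error of roughly 1 part in 10<sup>14</sup>. A minimal sketch of the calculation (the figures come from the paragraph above; the variable names are illustrative):

```python
# Fractional frequency error implied by a given daily drift
seconds_per_day = 24 * 60 * 60          # 86,400 s in one day
drift_per_day = 1e-9                    # clock error per day, in seconds
fractional_error = drift_per_day / seconds_per_day
print(f"fractional error ≈ {fractional_error:.2e}")  # about 1.16e-14, i.e. ~1 part in 10^14
```

The same ratio, read in reverse, says such a clock would take on the order of three million years to drift by one second.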
==History==
The idea of using atomic transitions to measure time was first suggested by Lord Kelvin in 1879.〔Sir William Thomson (Lord Kelvin) and Peter Guthrie Tait, ''Treatise on Natural Philosophy'', 2nd ed. (Cambridge, England: Cambridge University Press, 1879), vol. 1, part 1, page 227.〕 Magnetic resonance, developed in the 1930s by Isidor Rabi, became the practical method for doing this. In 1945, Rabi first publicly suggested that atomic beam magnetic resonance might be used as the basis of a clock.〔See: Isidor I. Rabi, "Radiofrequency spectroscopy" (Richtmyer Memorial Lecture, delivered at Columbia University, New York, 20 January 1945); "Meeting at New York, January 19 and 20, 1945", ''Physical Review'', vol. 67, pages 199–204 (1945); William L. Laurence, "'Cosmic pendulum' for clock planned", ''New York Times'', 21 January 1945, page 34. Reprinted on page 77 of: Lombardi, Michael A.; Heavner, Thomas P.; Jefferts, Steven R. (December 2007), "NIST primary frequency standards and the realization of the SI second", ''NCSLI Measure'', vol. 2, no. 4, pages 74–89.〕

The first atomic clock was an ammonia maser device built in 1949 at the U.S. National Bureau of Standards (NBS, now NIST). It was less accurate than existing quartz clocks, but served to demonstrate the concept. The first accurate atomic clock, a caesium standard based on a particular transition of the caesium-133 atom, was built by Louis Essen and Jack Parry in 1955 at the National Physical Laboratory in the UK. The caesium standard atomic clock was calibrated against the astronomical time scale ''ephemeris time'' (ET). This led to the internationally agreed definition of the SI second being based on atomic time. Equality of the ET second with the (atomic clock) SI second has been verified to within 1 part in 10<sup>10</sup>.〔Pages 413–414 state that the SI second was made equal to the second of ephemeris time as determined from lunar observations, and was later verified in this relation to 1 part in 10<sup>10</sup>.〕 The SI second thus inherits the effect of decisions by the original designers of the ephemeris time scale, which determined the length of the ET second.

Since development began in the 1950s, atomic clocks have been based on the hyperfine transitions in hydrogen-1, caesium-133, and rubidium-87. The first commercial atomic clock was the Atomichron, manufactured by the National Company; more than 50 were sold between 1956 and 1960. This bulky and expensive instrument was subsequently replaced by much smaller rack-mountable devices, such as the Hewlett-Packard model 5060 caesium frequency standard, released in 1964.

In the late 1990s, four factors contributed to major advances in clocks:
*Laser cooling and trapping of atoms
*High-finesse Fabry–Pérot cavities for narrow laser line widths
*Precision laser spectroscopy
*Convenient counting of optical frequencies using optical frequency combs

In August 2004, NIST scientists demonstrated a chip-scale atomic clock.〔Available online at NIST.gov〕 According to the researchers, the clock was believed to be one-hundredth the size of any other. It requires no more than 125 mW,〔"SA.45s CSAC Chip Scale Atomic Clock"〕 making it suitable for battery-driven applications. This technology became available commercially in 2011. Ion trap experimental optical clocks are more precise than the current caesium standard. In 2016, NASA planned to deploy the Deep Space Atomic Clock (DSAC), a miniaturized, ultra-precise mercury-ion atomic clock, into outer space; the DSAC is considered much more stable than other current navigational clocks.

Excerpt source: Wikipedia, the free encyclopedia ("atomic clock").